Evolving Metric Learning for Incremental and Decremental Features
Abstract
Online metric learning has been widely exploited for large-scale data classification due to its low computational cost. However, in many practical online scenarios the features themselves are evolving (e.g., some features vanish while new ones are augmented), and most existing models cannot be applied successfully in these settings, even though they handle evolving instances efficiently. To address this challenge, we develop an Evolving Metric Learning (EML) model for incremental and decremental features, which handles instance and feature evolutions simultaneously by incorporating a smoothed Wasserstein distance. Specifically, our model contains two essential stages: a Transforming stage (T-stage) and an Inheriting stage (I-stage). In the T-stage, we propose to extract important information from the vanished features while neglecting non-informative knowledge, and forward it into the survived features by transforming them into a low-rank discriminative space. This stage further explores the intrinsic structure of heterogeneous samples to reduce the computation and memory burden, especially for high-dimensional data. In the I-stage, we inherit the performance of the T-stage on the survived features and then expand the model to include the newly augmented features. Moreover, the smoothed Wasserstein distance is utilized to characterize the similarity relationships among complex samples, since the features are not strictly aligned across different stages. In addition to tackling these challenges in the one-shot case, we also extend our model to the multi-shot scenario. After deriving an efficient optimization strategy for both stages, extensive experiments on several datasets verify the superior performance of our EML model.
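The two-stage pipeline described above can be caricatured in a few lines of NumPy. This is a deliberately simplified sketch under strong assumptions: the T-stage is approximated by a least-squares low-rank map from the survived features onto the full old feature space (standing in for the learned discriminative metric), and the I-stage simply appends the augmented features; the smoothed Wasserstein distance is omitted entirely. The names `t_stage` and `i_stage` are hypothetical, not the authors' implementation.

```python
import numpy as np

def t_stage(X_old, n_survived, rank=2):
    """Toy T-stage: learn a low-rank map from the survived features
    back onto the full old space (vanished + survived), so that
    information carried by vanished features is partially preserved.
    X_old: (n_samples, d_old); the last n_survived columns survive."""
    X_surv = X_old[:, -n_survived:]
    # Least-squares map X_surv @ W ~= X_old, then truncate W to the
    # given rank via SVD (a crude low-rank surrogate for the metric).
    W, *_ = np.linalg.lstsq(X_surv, X_old, rcond=None)
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return (U[:, :rank] * s[:rank]) @ Vt[:rank]  # (n_survived, d_old)

def i_stage(X_surv, X_aug, W_lr):
    """Toy I-stage: inherit the T-stage representation of the
    survived features and append the newly augmented features."""
    return np.hstack([X_surv @ W_lr, X_aug])
```

A usage pattern would be to fit `t_stage` on the last batch where the vanished features are still observed, then feed all later batches (survived plus augmented features) through `i_stage`.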
Similar resources
One-Pass Learning with Incremental and Decremental Features
In many real tasks the features are evolving, with some features vanishing and some other features being augmented. For example, in environment monitoring some sensors expire whereas new ones are deployed; in mobile game recommendation some games are dropped whereas new ones are added. Learning with such incremental and decremental features is crucial but rarely studied, particularly when the dat...
Incremental and Decremental Support Vector Machine Learning
An on-line recursive algorithm for training support vector machines, one vector at a time, is presented. Adiabatic increments retain the Kuhn-Tucker conditions on all previously seen training data, in a number of steps each computed analytically. The incremental procedure is reversible, and decremental "unlearning" offers an efficient method to exactly evaluate leave-one-out generalization perfo...
Incremental and Decremental Learning for Linear Support Vector Machines
We present a method to find the exact maximal margin hyperplane for linear Support Vector Machines when a new (existing) component is added (removed) to (from) the inner product. The maximal margin hyperplane with the new inner product is obtained in terms of that for the old inner product, without re-computing it from scratch and the procedure is reversible. An algorithm to implement the propo...
Evolving Classifiers: Methods for Incremental Learning
The ability of a classifier to take on new information and classes by evolving the classifier without it having to be fully retrained is known as incremental learning. Incremental learning has been successfully applied to many classification problems, where the data is changing and is not all available at once. In this paper there is a comparison between Learn++, which is one of the most recent...
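As a minimal illustration of the general idea in this abstract (updating a classifier with new data and new classes without full retraining), here is a nearest-class-mean classifier maintained via running means. This is not Learn++ (which is an ensemble method); the class name and API below are hypothetical.

```python
import numpy as np

class IncrementalNCM:
    """Nearest-class-mean classifier updated one batch at a time;
    new classes can appear in later batches without retraining."""
    def __init__(self):
        self.means = {}   # class label -> running mean vector
        self.counts = {}  # class label -> number of samples seen

    def partial_fit(self, X, y):
        for xi, yi in zip(np.asarray(X, dtype=float), np.asarray(y)):
            if yi not in self.means:
                self.means[yi] = np.zeros_like(xi)
                self.counts[yi] = 0
            self.counts[yi] += 1
            # Incremental mean update: m += (x - m) / n
            self.means[yi] += (xi - self.means[yi]) / self.counts[yi]

    def predict(self, X):
        labels = list(self.means)
        M = np.stack([self.means[c] for c in labels])
        d = np.linalg.norm(np.asarray(X)[:, None, :] - M[None], axis=2)
        return np.array([labels[i] for i in d.argmin(axis=1)])
```

The incremental-mean recurrence makes each `partial_fit` call O(batch size), so earlier batches never need to be revisited, and an unseen label in a later batch simply creates a new class prototype.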
Incremental and Decremental Exponential Discriminant Analysis for Face Recognition
Linear Discriminant Analysis (LDA) is widely used for feature extraction in face recognition but suffers from small sample size (SSS) problem in its original formulation. Exponential discriminant analysis (EDA) is one of the variants of LDA suggested recently to overcome this problem. For many real time systems, it may not be feasible to have all the data samples in advance before the actual mo...
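The core EDA idea mentioned here, replacing the LDA scatter matrices with their matrix exponentials so the within-class scatter is always invertible, can be sketched compactly. This is a toy batch version under simplifying assumptions (no incremental/decremental updates, plain eigendecomposition); `eda_directions` is a hypothetical name.

```python
import numpy as np

def expm_sym(A):
    """Matrix exponential of a symmetric matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.exp(w)) @ V.T

def eda_directions(X, y, n_components=1):
    """Toy EDA: Fisher-style directions computed from exp(Sb), exp(Sw).
    exp(Sw) is always invertible, which is how EDA sidesteps the
    small-sample-size (SSS) problem of plain LDA."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sb = np.zeros((d, d))  # between-class scatter
    Sw = np.zeros((d, d))  # within-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)
        Sw += (Xc - mc).T @ (Xc - mc)
    # Leading eigenvectors of exp(Sw)^(-1) exp(Sb).
    M = np.linalg.solve(expm_sym(Sw), expm_sym(Sb))
    w, V = np.linalg.eig(M)
    order = np.argsort(-w.real)
    return V[:, order[:n_components]].real
```

For well-conditioned data the recovered direction behaves like the ordinary LDA direction; the difference only matters when `Sw` is singular.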
Journal
عنوان ژورنال: IEEE Transactions on Circuits and Systems for Video Technology
Year: 2021
ISSN: 1051-8215, 1558-2205
DOI: https://doi.org/10.1109/tcsvt.2021.3093953